"Monuments," Reviewed: The Confederacy Surrenders to a Truer American Past

The New Yorker

As the Trump Administration tries to rescue symbols of the Lost Cause, an exhibition in Los Angeles, led by Kara Walker, finds meaning in their desecration. Kara Walker's "Unmanned Drone" (2023) transforms a Stonewall Jackson statue. The first thing you see is a horse's ass, protruding, upside down, from the thorax of a monster. A man's arm descends from the beast's stomach, his gloved hand clutching the blade of a fallen sabre. Every part of the work comes from a statue of the Confederate general Stonewall Jackson that was removed from Charlottesville, Virginia, in 2021.


Podcasts as a Medium for Participation in Collective Action: A Case Study of Black Lives Matter

Moldovan, Theodora, Pera, Arianna, Vega, Davide, Aiello, Luca Maria

arXiv.org Artificial Intelligence

We study how participation in collective action is articulated in podcast discussions, using the Black Lives Matter (BLM) movement as a case study. While research on collective action discourse has primarily focused on text-based content, this study takes a first step toward analyzing audio formats by using podcast transcripts. Using the Structured Podcast Research Corpus (SPoRC), we investigated spoken language expressions of participation in collective action, categorized as problem-solution, call-to-action, intention, and execution. We identified podcast episodes discussing racial justice after important BLM-related events in May and June of 2020, and extracted participatory statements using a layered framework adapted from prior work on social media. We examined the emotional dimensions of these statements, detecting eight key emotions and their association with varying stages of activism. We found that emotional profiles vary by stage, with different positive emotions standing out during calls-to-action, intention, and execution. We detected negative associations between collective action and negative emotions, contrary to theoretical expectations. Our work contributes to a better understanding of how activism is expressed in spoken digital discourse and how emotional framing may depend on the format of the discussion.


The Download: the problem with police bodycams, and how to make useful robots

MIT Technology Review

When police departments first started buying and deploying bodycams in the wake of the police killing of Michael Brown in Ferguson, Missouri, a decade ago, activists hoped it would bring about real change. Years later, despite what's become a multibillion-dollar market for these devices, the tech is far from a panacea. Most of the vast reams of footage they generate go unwatched. And if they do finally provide video to the public, it's often selectively edited, lacking context and failing to tell the complete story. A handful of AI startups see this problem as an opportunity to create what are essentially bodycam-to-text programs for different players in the legal system, mining this footage for misdeeds.


Should Local Police Departments Deploy Lethal Robots?

The New Yorker

Last month, the San Francisco Board of Supervisors voted in favor of allowing that city's police department to deploy robots equipped with the potential to kill, should a situation--in the estimation of police officers--call for lethal force. With that decision, the board appeared to have delivered the city to a dystopian future. The vote garnered a loudly negative response from the public, and this week the supervisors reversed course and sent the policy back to committee. But the fact that the decision initially passed--and may yet pass in some form--should not have been surprising. Police departments around the country have been acquiring robotic devices for decades.


San Francisco will allow police to deploy robots that kill

#artificialintelligence

Supervisors in San Francisco voted Tuesday to give city police the ability to use potentially lethal, remote-controlled robots in emergency situations -- following an emotionally charged debate that reflected divisions on the politically liberal board over support for law enforcement. The vote was 8-3, with the majority agreeing to grant police the option despite strong objections from civil liberties and other police oversight groups. Opponents said the authority would lead to the further militarization of a police force already too aggressive with poor and minority communities. Supervisor Connie Chan, a member of the committee that forwarded the proposal to the full board, said she understood concerns over use of force but that "according to state law, we are required to approve the use of these equipments. So here we are, and it's definitely not an easy discussion."


San Francisco approves police proposal to use potentially deadly robots

The Guardian

Police in San Francisco will be allowed to deploy potentially lethal, remote-controlled robots in emergency situations. The controversial policy was approved after weeks of scrutiny and a heated debate among the city's board of supervisors during their meeting on Tuesday. Police oversight groups, the ACLU, and San Francisco's public defender had urged the 11-member body to reject the police department's use-of-equipment proposal. Opponents of the policy said it would lead to further militarization of a police force already too aggressive with underserved communities. They said the parameters under which use would be allowed were too vague.


What Happens When Our Faces Are Tracked Everywhere We Go?

#artificialintelligence

When a secretive start-up scraped the internet to build a facial-recognition tool, it tested a legal and ethical limit -- and blew the future of privacy in America wide open.

In May 2019, an agent at the Department of Homeland Security received a trove of unsettling images. Found by Yahoo in a Syrian user's account, the photos seemed to document the sexual abuse of a young girl. One showed a man with his head reclined on a pillow, gazing directly at the camera. The man appeared to be white, with brown hair and a goatee, but it was hard to really make him out; the photo was grainy, the angle a bit oblique. The agent sent the man's face to child-crime investigators around the country in the hope that someone might recognize him.

When an investigator in New York saw the request, she ran the face through an unusual new facial-recognition app she had just started using, called Clearview AI. The team behind it had scraped the public web -- social media, employment sites, YouTube, Venmo -- to create a database with three billion images of people, along with links to the webpages from which the photos had come. This dwarfed the databases of other such products for law enforcement, which drew only on official photography like mug shots, driver's licenses and passport pictures; with Clearview, it was effortless to go from a face to a Facebook account.

The app turned up an odd hit: an Instagram photo of a heavily muscled Asian man and a female fitness model, posing on a red carpet at a bodybuilding expo in Las Vegas. The suspect was neither Asian nor a woman. But upon closer inspection, you could see a white man in the background, at the edge of the photo's frame, standing behind the counter of a booth for a workout-supplements company. On Instagram, his face would appear about half as big as your fingernail. The federal agent was astounded. The agent contacted the supplements company and obtained the booth worker's name: Andres Rafael Viola, who turned out to be an Argentine citizen living in Las Vegas.


Senate rejects proposed limits on transfers of military-grade weapons, gear to local police

FOX News

The Senate on Tuesday rejected a bipartisan proposal to curtail the transfer of military-grade weapons and gear to local police departments. Senators voted 51-49 on the proposal, falling short of the 60 votes needed to pass. Spearheaded by Sen. Brian Schatz, D-Hawaii, the amendment to the National Defense Authorization Act (NDAA) proposed limiting the transfer of tracked combat vehicles, armed drones, grenade launchers, and tear gas to local police departments across the U.S.


The Racist Roots of New Technology

#artificialintelligence

Race After Technology opens with a brief personal history set in the Crenshaw neighborhood of Los Angeles, where sociologist Ruha Benjamin spent a portion of her childhood. Recalling the time she set up shop on her grandmother's porch with a chalkboard and invited other kids to do math problems, she writes, "For the few who would come, I would hand out little slips of paper…until someone would insist that we go play tag or hide-and-seek instead. Needless to say, I didn't have that many friends!" As she gazed out the back window during car rides, she saw "boys lined up for police pat-downs," and inside the house she heard "the nonstop rumble of police helicopters overhead, so close that the roof would shake." The omnipresent surveillance continued when she visited her grandmother years later as a mother, her homecomings blighted by "the frustration of trying to keep the kids asleep with the sound and light from the helicopter piercing the window's thin pane." Benjamin's personal beginning sets the tone for her book's approach, one that focuses on how modern invasive technologies--from facial recognition software to electronic ankle monitors to the metadata of photos taken at protests--further racial inequality.